Search results for "Total correlation"

Showing 8 of 8 documents

Visual information flow in Wilson-Cowan networks.

2020

In this paper, we study the communication efficiency of a psychophysically tuned cascade of Wilson-Cowan and divisive normalization layers that simulate the retina-V1 pathway. This is the first analysis of Wilson-Cowan networks in terms of multivariate total correlation. The parameters of the cortical model have been derived through the relation between the steady state of the Wilson-Cowan model and the divisive normalization model. The communication efficiency has been analyzed in two ways: First, we provide an analytical expression for the reduction of the total correlation among the responses of a V1-like population after the application of the Wilson-Cowan interaction. Second, we empiri…
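The multivariate total correlation analyzed in the abstract is the redundancy measure TC(X) = Σ_i H(X_i) − H(X). A minimal sketch of its empirical estimate for discrete responses (illustrative code, not the paper's implementation):

```python
import numpy as np

def total_correlation(samples):
    """Empirical total correlation TC(X) = sum_i H(X_i) - H(X), in bits,
    for discrete samples of shape (n_samples, n_vars)."""
    def entropy(labels):
        _, counts = np.unique(labels, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))
    joint = entropy(samples)
    marginals = sum(entropy(samples[:, [i]]) for i in range(samples.shape[1]))
    return marginals - joint

# Two perfectly coupled binary variables: TC = H(X1) + H(X2) - H(X1,X2) = 1 bit
x = np.array([0, 1, 0, 1, 0, 1, 0, 1])
data = np.stack([x, x], axis=1)
print(total_correlation(data))  # → 1.0
```

A TC of zero would indicate statistically independent responses, which is why a reduction of TC across the Wilson-Cowan interaction is read as a gain in coding efficiency.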

Keywords: Normalization (statistics), Physiology, Computer science, Computation, Population, Models Biological, Retina, Wilson–Cowan equations, Multi-information, Humans, Visual Pathways, Efficient coding hypothesis, Efficient representation principle, Visual Cortex, Normalization model, General Neuroscience, Univariate, FOS: Biological sciences, Quantitative Biology - Neurons and Cognition (q-bio.NC), Divisive normalization, Visual Perception, Total correlation, Neural Networks Computer, Nerve Net, Algorithm, Image compression
Published in: Journal of neurophysiology

Gaussianizing the Earth: Multidimensional Information Measures for Earth Data Analysis

2021

Information theory is an excellent framework for analyzing Earth system data because it allows us to characterize uncertainty and redundancy, and is universally interpretable. However, accurately estimating information content is challenging because spatio-temporal data is high-dimensional, heterogeneous and has non-linear characteristics. In this paper, we apply multivariate Gaussianization for probability density estimation which is robust to dimensionality, comes with statistical guarantees, and is easy to apply. In addition, this methodology allows us to estimate information-theoretic measures to characterize multivariate densities: information, entropy, total correlation, and mutual in…
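Once the data have been Gaussianized, the multivariate density is Gaussian and total correlation has a closed form, −½ log det R, where R is the correlation matrix. A small sketch under that Gaussian assumption (function name is illustrative, not from the paper):

```python
import numpy as np

def gaussian_total_correlation(cov):
    """Total correlation (in nats) of a multivariate Gaussian with
    covariance `cov`: TC = -0.5 * log det(R), R the correlation matrix."""
    d = np.sqrt(np.diag(cov))
    R = cov / np.outer(d, d)
    return -0.5 * np.log(np.linalg.det(R))

# Independent variables: correlation matrix is the identity, TC = 0
print(gaussian_total_correlation(np.diag([2.0, 5.0])))  # ≈ 0.0

# Correlated pair: TC = -0.5 * log(1 - rho^2)
rho = 0.8
print(gaussian_total_correlation(np.array([[1.0, rho], [rho, 1.0]])))  # ≈ 0.511 nats
```

The determinant-based form is what makes the estimate robust to dimensionality: it needs only second-order statistics of the Gaussianized data.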

Keywords: FOS: Computer and information sciences, Multivariate statistics, General Computer Science, Computer science, Machine Learning (stat.ML), Mutual information, Information theory, Statistics - Applications (stat.AP), Earth system science, Redundancy (information theory), Climate action, Statistics - Machine Learning, General Earth and Planetary Sciences, Entropy (information theory), Total correlation, Data mining, Electrical and Electronic Engineering, Instrumentation, Curse of dimensionality

Implementation of the full explicitly correlated coupled-cluster singles and doubles model CCSD-F12 with optimally reduced auxiliary basis dependence.

2008

An implementation of the full explicitly correlated coupled-cluster singles and doubles model CCSD-F12 using a single Slater-type geminal has been obtained with the aid of automated term generation and evaluation techniques. In contrast to a previously reported computer code [T. Shiozaki et al., J. Chem. Phys. 129, 071101 (2008)], our implementation features a reduced dependence on the auxiliary basis set due to the use of a reformulated evaluation of the so-called Z-intermediate rather than straightforward insertion of an auxiliary basis expansion, which allows an unambiguous comparison to more approximate CCSD-F12 models. First benchmark results for total correlation energies and reactio…

Keywords: Source code, Geminal, Basis (linear algebra), Chemistry, General Physics and Astronomy, Term (time), Coupled cluster, Benchmark (computing), Applied mathematics, Total correlation, Physical and Theoretical Chemistry, Atomic physics, Basis set
Published in: The Journal of chemical physics

Computing variations of entropy and redundancy under nonlinear mappings not preserving the signal dimension: quantifying the efficiency of V1 cortex

2021

In computational neuroscience, the Efficient Coding Hypothesis argues that the neural organization comes from the optimization of information-theoretic goals [Barlow Proc.Nat.Phys.Lab.59]. A way to confirm this requires the analysis of the statistical performance of biological systems that have not been statistically optimized [Renart et al. Science10, Malo&Laparra Neur.Comp.10, Foster JOSA18, Gomez-Villa&Malo J.Neurophysiol.19]. However, when analyzing the information-theoretic performance, cortical magnification in the retina-cortex pathway poses a theoretical problem. Cortical magnification refers to the increase in signal dimensionality [Cowey&Rolls Exp. Brain Res.74]. Conventional mo…
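The dimension-preserving baseline that such an analysis must generalize is the classical change-of-variables rule for differential entropy under an invertible map y = g(x): h(Y) = h(X) + E[log |det J_g|]. A quick numerical check of that rule for a linear map on a Gaussian (illustrative code, not the authors'):

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (nats) of a d-dimensional Gaussian:
    h = 0.5 * (d * log(2*pi*e) + log det(cov))."""
    d = cov.shape[0]
    return 0.5 * (d * np.log(2 * np.pi * np.e) + np.log(np.linalg.det(cov)))

# Invertible linear map y = A x has constant Jacobian A,
# so the rule predicts h(Y) - h(X) = log|det A|.
A = np.array([[2.0, 1.0], [0.0, 3.0]])
Sigma = np.eye(2)
h_x = gaussian_entropy(Sigma)
h_y = gaussian_entropy(A @ Sigma @ A.T)
print(np.isclose(h_y - h_x, np.log(abs(np.linalg.det(A)))))  # → True
```

When the mapping does not preserve dimension, the Jacobian is no longer square and this rule no longer applies directly, which is the theoretical problem the abstract raises.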

Keywords: Wavelet, Redundancy (information theory), Dimension (vector space), Computer science, Jacobian matrix and determinant, Entropy (information theory), Total correlation, Efficient coding hypothesis, Algorithm, Curse of dimensionality
Published in: Proceedings of Entropy 2021: The Scientific Tool of the 21st Century

Anisotropic pair correlations in binary and multicomponent hard-sphere mixtures in the vicinity of a hard wall: A combined density functional theory …

2015

The fundamental measure approach to classical density functional theory has been shown to be a powerful tool to predict various thermodynamic properties of hard-sphere systems. We employ this approach to determine not only one-particle densities but also two-particle correlations in binary and six-component mixtures of hard spheres in the vicinity of a hard wall. The broken isotropy enables us to carefully test a large variety of theoretically predicted two-particle features by quantitatively comparing them to the results of Brownian dynamics simulations. Specifically, we determine and compare the one-particle density, the total correlation functions, their contact values, and the force dis…

Keywords: Physics, Statistical Mechanics (cond-mat.stat-mech), Isotropy, FOS: Physical sciences, Hard spheres, Soft Condensed Matter (cond-mat.soft), Measure (mathematics), Brownian dynamics, Compressibility, Density functional theory, Total correlation, Statistical physics, Anisotropy

Total Correlation Spectroscopy (TOCSY) of Proteins Using Coaddition of Spectra Recorded with Several Mixing Times

1993

Keywords: Nuclear magnetic resonance, Chemistry, General Engineering, Analytical chemistry, Pulse sequence, Total correlation, Nuclear Overhauser effect, Nuclear magnetic resonance spectroscopy, Spectroscopy, Spectral line, Mixing (physics)
Published in: Journal of Magnetic Resonance, Series B

Measuring Functional Connectivity of Human Intra-Cortex Regions with Total Correlation

2021

The economy of brain organization lets the primate brain consume little energy while remaining efficient. Neurons are densely wired to one another, depending on both anatomical structural connectivity and functional connectivity. Here, I describe only functional connectivity, using functional Magnetic Resonance Imaging (fMRI) data. Most importantly, the question is how to quantitatively measure information shared or separated among functional brain regions; moreover, fMRI data suffer from high dimensionality, the "curse of dimensionality" [1]. The multivariate total correlation method can address these problems. In this paper, two things measured with the information-theoretic technique - total correlation [2,…

Keywords: Series (mathematics), Computer science, Complex system, Pattern recognition, Perception, Medicine, Total correlation, Artificial intelligence, Entropy (energy dispersal), Functional magnetic resonance imaging, Energy (signal processing), Curse of dimensionality
Published in: Proceedings of Entropy 2021: The Scientific Tool of the 21st Century

Functional connectivity inference from fMRI data using multivariate information measures

2022

Abstract Shannon’s entropy or an extension of Shannon’s entropy can be used to quantify information transmission between or among variables. Mutual information is the pair-wise information that captures nonlinear relationships between variables. It is more robust than linear correlation methods. Beyond mutual information, two generalizations are defined for multivariate distributions: interaction information or co-information and total correlation or multi-mutual information. In comparison to mutual information, interaction information and total correlation are underutilized and poorly studied in applied neuroscience research. Quantifying information flow between brain regions is not explic…
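The two multivariate generalizations named in the abstract can both be written in terms of subset entropies: total correlation is Σ_i H(X_i) − H(X_1,…,X_n), and interaction information (co-information) is an inclusion-exclusion sum over subset entropies. A small sketch on the three-variable XOR distribution (illustrative code; note that sign conventions for interaction information differ across the literature):

```python
import numpy as np
from itertools import combinations

def entropy(p):
    """Shannon entropy (bits) of a joint probability array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def marginal(p, axes):
    """Marginalize the joint array p down to the given axes."""
    keep = set(axes)
    drop = tuple(i for i in range(p.ndim) if i not in keep)
    return p.sum(axis=drop)

# XOR joint distribution: Z = X xor Y, with X and Y fair and independent
p = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        p[x, y, x ^ y] = 0.25

# Total correlation: sum of marginal entropies minus the joint entropy
tc = sum(entropy(marginal(p, (i,))) for i in range(3)) - entropy(p)
print(tc)  # → 1.0

# Co-information via inclusion-exclusion over all nonempty subset entropies
ii = sum((-1) ** (3 - len(s)) * entropy(marginal(p, s))
         for r in (1, 2, 3) for s in combinations(range(3), r))
print(ii)  # → -1.0
```

XOR is the textbook example of pure synergy: every pairwise mutual information is zero, yet the total correlation is 1 bit and the co-information is -1 bit.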

Keywords: Brain Mapping, Computer science, Entropy, Cognitive Neuroscience, Conditional mutual information, Brain, Multivariate normal distribution, Mutual information, Magnetic Resonance Imaging, Interaction information, Redundancy (information theory), Artificial Intelligence, Entropy (information theory), Computer Simulation, Total correlation, Information flow (information theory), Data mining
Published in: Neural Networks